136 research outputs found
Using PlanetLab for network research: Myths, realities, and best practices
PlanetLab is a research testbed that supports 428 experiments on 276 sites, with 583 nodes in 30 countries. It has lowered the barrier to distributed experimentation in network measurement, peer-to-peer networks, content distribution…
BitTorrent is an Auction: Analyzing and Improving BitTorrent's Incentives
Incentives play a crucial role in BitTorrent, motivating users to upload to others to achieve fast download times for all peers. Though long believed to be robust to strategic manipulation, recent work has empirically shown that BitTorrent does not provide its users incentive to follow the protocol. We propose an auction-based model to study and improve upon BitTorrent's incentives. The insight behind our model is that BitTorrent uses, not tit-for-tat as widely believed, but an auction to decide which peers to serve. Our model not only captures known, performance-improving strategies, it shapes our thinking toward new, effective strategies. For example, our analysis demonstrates, counter-intuitively, that BitTorrent peers have incentive to intelligently under-report what pieces of the file they have to their neighbors. We implement and evaluate a modification to BitTorrent in which peers reward one another with proportional shares of bandwidth. Within our game-theoretic model, we prove that a proportional-share client is strategy-proof. With experiments on PlanetLab, a local cluster, and live downloads, we show that a proportional-share unchoker yields faster downloads against BitTorrent and BitTyrant clients, and that under-reporting pieces yields prolonged neighbor interest.
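The proportional-share idea in the abstract — each peer rewards its neighbors with upload bandwidth in proportion to what they contributed — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the function name, the per-round accounting, and the even-split bootstrap for new neighbors are all assumptions made for the example.

```python
# Hypothetical sketch of a proportional-share allocator, assuming each peer
# tracks the bytes received from every neighbor over the previous round.

def proportional_share(received: dict[str, int], capacity: int) -> dict[str, int]:
    """Split upload capacity among neighbors in proportion to the bytes
    each contributed last round (names and units are illustrative)."""
    total = sum(received.values())
    if total == 0:
        # No contributions yet: split evenly (one simple bootstrap choice,
        # not necessarily what the paper's client does).
        n = len(received)
        return {peer: capacity // n for peer in received} if n else {}
    return {peer: capacity * contrib // total
            for peer, contrib in received.items()}

shares = proportional_share({"a": 300, "b": 100, "c": 0}, capacity=400)
# "a" contributed 3x as much as "b", so it receives 3x the bandwidth;
# "c" contributed nothing and receives nothing this round.
```

Because each neighbor's reward grows with its own contribution, a strategic peer's best response is simply to upload more, which is the intuition behind the strategy-proofness claim.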
A Secure DHT via the Pigeonhole Principle
The standard Byzantine attack model assumes no more than some fixed fraction of the participants are faulty. This assumption does not accurately apply to peer-to-peer settings, where Sybil attacks and botnets are realistic threats. We propose an attack model that permits an arbitrary number of malicious nodes under the assumption that each node can be classified based on some of its attributes, such as autonomous system number or operating system, and that the number of classes with malicious nodes is bounded (e.g., an attacker may exploit at most a few operating systems at a time). In this model, we present a secure DHT, evilTwin, which replaces a single, large DHT with sufficiently many smaller instances such that it is impossible for an adversary to corrupt every instance. Our system ensures high availability and low-latency lookups, is easy to implement, does not require a complex Byzantine agreement protocol, and its proof of security is a straightforward application of the pigeonhole principle. The cost of security comes in the form of increased storage and bandwidth overhead; we show how to reduce these costs by replicating data and adaptively querying participants who historically perform well. We use implementation and simulation to show that evilTwin imposes a relatively small additional cost compared to conventional DHTs.
The Somatic Genomic Landscape of Glioblastoma
We describe the landscape of somatic genomic alterations based on multi-dimensional and comprehensive characterization of more than 500 glioblastoma tumors (GBMs). We identify several novel mutated genes as well as complex rearrangements of signature receptors including EGFR and PDGFRA. TERT promoter mutations are shown to correlate with elevated mRNA expression, supporting a role in telomerase reactivation. Correlative analyses confirm that the survival advantage of the proneural subtype is conferred by the G-CIMP phenotype, and that MGMT DNA methylation may be a predictive biomarker for treatment response only in classical-subtype GBM. Integrative analysis of genomic and proteomic profiles challenges the notion of therapeutic inhibition of a pathway as an alternative to inhibition of the target itself. These data will facilitate the discovery of therapeutic and diagnostic target candidates, the validation of research and clinical observations, and the generation of unanticipated hypotheses that can advance our molecular understanding of this lethal cancer.
Reverse Engineering the Internet
Understanding the structure and design of the Internet is increasingly important as we seek to improve its reliability and robustness. At the same time, as the network grows in scale and diversity, a complete and accurate view is increasingly hard to come by. In this paper, we present a framework for classifying network measurement tools by how they can contribute to network mapping and whole-Internet analysis. We describe techniques to accommodate the scale and heterogeneity of the network. While most network measurement tools focus on performance and pathologies, we focus on developing tools to understand structure, design, and routing policy.